<?xml version="1.0" encoding="ISO-8859-1"?>
<metadatalist>
	<metadata ReferenceType="Conference Proceedings">
		<site>sibgrapi.sid.inpe.br 802</site>
		<identifier>8JMKD3MGPEW34M/47JU645</identifier>
		<repository>sid.inpe.br/sibgrapi/2022/09.10.19.35</repository>
		<lastupdate>2022:09.10.19.35.42 sid.inpe.br/banon/2001/03.30.15.38 arbackes@yahoo.com.br</lastupdate>
		<metadatarepository>sid.inpe.br/sibgrapi/2022/09.10.19.35.42</metadatarepository>
		<metadatalastupdate>2023:05.23.04.20.42 sid.inpe.br/banon/2001/03.30.15.38 administrator {D 2022}</metadatalastupdate>
		<doi>10.1109/SIBGRAPI55357.2022.9991771</doi>
		<citationkey>Backes:2022:PaImCl</citationkey>
		<title>Pap-smear image classification by using a fusion of texture features</title>
		<shorttitle>Pap-smear image classification by using a fusion of texture features</shorttitle>
		<format>On-line</format>
		<year>2022</year>
		<numberoffiles>1</numberoffiles>
		<size>644 KiB</size>
		<author>Backes, André Ricardo,</author>
		<affiliation>School of Computer Science, Federal University of Uberlândia</affiliation>
		<e-mailaddress>arbackes@yahoo.com.br</e-mailaddress>
		<conferencename>Conference on Graphics, Patterns and Images, 35 (SIBGRAPI)</conferencename>
		<conferencelocation>Natal, RN</conferencelocation>
		<date>24-27 Oct. 2022</date>
		<booktitle>Proceedings</booktitle>
		<tertiarytype>Full Paper</tertiarytype>
		<transferableflag>1</transferableflag>
		<keywords>texture analysis, PSO, pap-smear, image classification.</keywords>
		<abstract>In this paper, we address the problem of pap-smear image classification. These images are of great medical importance for diagnosing and preventing cancer of the uterine cervix and have been intensively studied in computer vision research. We evaluated 19 texture features on their ability to discriminate between two classes (normal and abnormal) of pap-smear images. We performed the classification of these features using three different approaches: K-Nearest Neighbors (KNN), Support Vector Machine (SVM), and Linear Discriminant Analysis (LDA). We conducted this evaluation considering each texture method both independently and concatenated with the others. Results show that combining methods improves accuracy, surpassing most of the compared methods, including some deep learning approaches.</abstract>
		<language>en</language>
		<targetfile>backes_16.pdf</targetfile>
		<usergroup>arbackes@yahoo.com.br</usergroup>
		<visibility>shown</visibility>
		<mirrorrepository>sid.inpe.br/banon/2001/03.30.15.38.24</mirrorrepository>
		<nexthigherunit>8JMKD3MGPEW34M/495MHJ8</nexthigherunit>
		<citingitemlist>sid.inpe.br/sibgrapi/2023/05.19.12.10 14</citingitemlist>
		<citingitemlist>sid.inpe.br/sibgrapi/2022/06.10.21.49 1</citingitemlist>
		<hostcollection>sid.inpe.br/banon/2001/03.30.15.38</hostcollection>
		<agreement>agreement.html .htaccess .htaccess2</agreement>
		<lasthostcollection>sid.inpe.br/banon/2001/03.30.15.38</lasthostcollection>
		<url>http://sibgrapi.sid.inpe.br/rep-/sid.inpe.br/sibgrapi/2022/09.10.19.35</url>
	</metadata>
</metadatalist>